
Dynamics


Overview

SQLToLake V2 is a generic sample solution for exporting SQL Server (on-premises or Azure SQL) table data to an Azure Data Lake Gen2 storage account in Common Data Model (CDM) format. The solution uses Azure Data Factory pipelines and an Azure function built on the CDM SDK to copy SQL table data and generate CDM metadata in the storage account. The solution can also read the CDM manifest recursively and create views on a Synapse Analytics SQL-On-Demand database.

Use cases

You can use this Data Factory solution for the following use cases:

- Ingest an on-premises SQL database or Azure SQL database to Azure Data Lake in CDM format
- Ingest your on-premises Dynamics AX data to Azure Data Lake in CDM format
- Ingest Finance and Operations app data from a Cloud Hosted dev environment or Tier 2 environment to Azure Data Lake in CDM format (a workaround for the Tables in Data Lake feature to build a POC)

SQLToADLS Full Export V2 Highlights

If you are new to Azure Data Factory, Azure Functions, and Synapse Analytics, you should try Version 1 first to get familiar with some of the basic concepts. Version 2 of the Data Factory solution primarily utilizes a C# solution, based on the CDM SDK and deployed as an Azure function, to automate some of the manual steps of creating views in Synapse Analytics. This version also generates a folder structure and metadata similar to the F&O Tables in Lake feature. Following are some highlights of this updated version of the Data Factory solution:

- Generates the Finance and Operations CDM folder structure in the data lake
- Automatically partitions data for large tables
- Generates metadata in manifest format
- Reads metadata (manifest) and creates views in SQL-On-Demand

Prerequisites

Azure subscription. You will require contributor access to an existing Azure subscription. If you don't have an Azure subscription, create a free Azure account before you begin.

Azure storage account. If you don't have a storage account, see Create an Azure storage account for steps to create one.

Synapse Analytics workspace. To create one, follow the steps in create synapse workspace.

Connect to the SQL-On-Demand endpoint: once you have provisioned a Synapse workspace, you can use Synapse Studio, SQL Server Management Studio (SSMS 18.5 or higher), or Azure Data Studio. For details, check the supported tools.

Azure Data Factory. To create an Azure Data Factory resource, follow the steps to create a Data factory.

Install Visual Studio 2019 to build and deploy the C# solution as an Azure Function app. You will also need the .NET Framework 4.6.2 developer pack (https://dotnet.microsoft.com/download/dotnet-framework/thank-you/net462-developer-pack-offline-installer).

Deployment steps

Deploy C# Solution as Azure function

1. Clone the repository Dynamics-365-FastTrack-Implementation-Assets
2. Open the C# solution Microsoft.CommonDataModel.sln in Visual Studio 2019 and build it
3. Publish the CDMUtil_AzureFunctions project as an Azure function:
   - Right-click the CDMUtil_AzureFunctions project in Solution Explorer and select Publish
   - Select Azure as the target and select Azure Function App (Windows)
   - Click Create new Azure Function App, then select the subscription and resource group to create the function app
   - Click Publish
4. Open the Azure portal and locate the Azure Function app you created. Enable MSI: go to the Identity tab and enable the system-assigned managed identity (/Analytics/EnableMSI.PNG)
5. To learn more about Azure Functions, follow the link Azure functions in Visual Studio

Setup Storage Account

1. In the Azure portal, go to the storage account and grant Blob Data Contributor and Blob Data Reader access to the Azure Function app MSI
2. Create a container dynamics365-financeandoperations
3. Create a folder under the container to represent your environment name, i.e. analyticsPerf87dd1496856e213a.cloudax.dynamics.com

Setup Azure Synapse Analytics and SQL-On-Demand (Serverless)

The ADF solution can also read the CDM metadata and create views on SQL-On-Demand. To use this automation, complete the following steps:

Create Database: Before you can create views and external tables to query data using T-SQL, you need to create a database.

```sql
CREATE DATABASE AXDB
```

Create data source: replace the storage account location with your own and provide a shared access signature.

```sql
-- create master key that will protect the credentials
-- (the password below is a placeholder; choose your own strong password):
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword>'

-- create credentials for containers in our demo storage account
-- Replace your storage account location and provide shared access signature
CREATE DATABASE SCOPED CREDENTIAL myenvironment
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=bf&srt=sco&sp=rl&st=2019-10-14T12%3A10%3A25Z&se=2061-12-31T12%3A10%3A00Z&sig=KlSU2ullCscyTS0An0nozEpo4tO5JAgGBvw%2FJX2lguw%3D'
GO

CREATE EXTERNAL DATA SOURCE myenvironmentds WITH (
    LOCATION   = 'https://mydatalake.dfs.core.windows.net/dynamics365-financeandoperations/myenvironment.cloudax.dynamics.com',
    CREDENTIAL = myenvironment
);
```

Create the Azure function MSI app as a user on SQL: make sure you use an AAD login for the connection to SQL-On-Demand, because otherwise you will receive an error that only AAD logins can perform this action.

```sql
USE AXDB
GO
-- replace with your Azure function name
CREATE USER [ABCCDMUtilAzureFunctions] FROM EXTERNAL PROVIDER;
ALTER ROLE db_owner ADD MEMBER [ABCCDMUtilAzureFunctions];
```
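For illustration, the views the automation creates follow the general shape sketched below: OPENROWSET reads the data files through the external data source, and the column list is derived from the CDM manifest. This is a minimal sketch only; the entity name CUSTTABLE, the file path, and the columns are hypothetical placeholders, and the actual statements are generated by CDMUtil.

```sql
-- Minimal sketch of a generated view (names and columns are placeholders).
CREATE VIEW dbo.CUSTTABLE AS
SELECT *
FROM OPENROWSET(
        BULK 'CUSTTABLE/*.csv',            -- path relative to the data source LOCATION
        DATA_SOURCE = 'myenvironmentds',   -- external data source created above
        FORMAT = 'CSV',
        PARSER_VERSION = '2.0'
     )
     WITH (
        RECID      BIGINT,                 -- column list comes from the CDM manifest
        ACCOUNTNUM NVARCHAR(20),
        CUSTGROUP  NVARCHAR(20)
     ) AS data;
```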

For more details, follow the documentation: First time setup.

Collect Azure Data Factory Deployment Parameters

Once the Azure function is deployed and the storage account is ready, collect all the parameters as described below.

1. Log in to the Azure portal, navigate to the Azure storage account, and note down the following:
   - Storage account > Properties > Data Lake storage > Primary endpoint Data Lake storage - example https://yourdatalakestorage.dfs.core.windows.net/
   - Storage account > Access keys > Key1 > Key - example XXXXXXXXXXXXXXXXXXXX/XXXXXXXXXXXXXXXXXXXXXXXXXXX==
2. Note down the Azure Data Factory name that you created earlier.
3. Note down your source SQL Server database connection string - example data source=dbserver;initial catalog=axdb;user id=sqladmin;password=PassWord. (You can sanity-check the connection string with the query shown after these steps.)
4. Navigate to the Function app deployed in the earlier steps and note down:
   - Azure function URL: Function App > Overview > URL - example https://msftcdmutilazurefunction.azurewebsites.net
   - Function app key: Function App > App Keys > Host Keys > Value (Name = _master)

Here are all the parameters that you need to deploy the Azure Data Factory template:

| Parameter name | Description | Example |
| --- | --- | --- |
| factoryName | Name of your data factory | SQLToDataLake |
| StorageAccount | Name of the storage account | d365fodatalakestorage |
| DataLakeGen2Storage_accountKey | Storage account access key | Secret key |
| Container | Storage account container | dynamics365-financeandoperations |
| RootFolder | Root folder | jjd365fo2d9ba7ea6d7563beaos.cloudax.dynamics.com |
| SQLDB_connectionString | Source SQL DB connection string | data source=dbservername.database.windows.net;initial catalog=databasename;user id=userid;password=PassWord |
| CDMUTIL_functionAppUrl | Azure function URI | https://cdmutil.azurewebsites.net |
| CDMUTIL_functionKey | App key | Access key |
| TenantId | Tenant ID | GUID |
| DataSourceName | Data source name on SQL-On-Demand | Data source name created on SQL-On-Demand |
| SQLEndPoint | Connection details of SQL-On-Demand | Server=d365ftazuresynapsedemo-ondemand.sql.azuresynapse.net;Database=AXDB |

Deploy Azure Data Factory Template

1. Download the ARM template file to your local directory.
2. Click Template deployment: https://ms.portal.azure.com/#create/Microsoft.Template
3. Click Build your own template in the editor option.
4. Click Load file, locate the ARM template file you downloaded earlier, and click Save.
5. Provide the required parameters and click Review + create.
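As a quick sanity check of the source connection string, you can connect to the source database with the same credentials and run a trivial query before deploying the template. This is a minimal check; any query against the source schema will do.

```sql
-- Run against the source AXDB using the same credentials as
-- SQLDB_connectionString; confirms connectivity and read access.
SELECT TOP 10 name, create_date
FROM sys.tables
ORDER BY name;
```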

Connecting to Finance and Operations Cloud Hosted Tier 1 Environment or Sandbox Tier 2

If your source system is a Dynamics 365 for Finance and Operations Cloud Hosted dev or Tier 2 environment, you can get the database connection details from the Lifecycle Services (LCS) environment details page. You need Environment Manager or Project Owner access in LCS to see the database connection details.

Cloud Hosted Tier 1 environment

To connect Azure Data Factory to a Dynamics 365 for Finance and Operations Cloud Hosted development environment, you need to create a self-hosted integration runtime for your data factory:

1. Follow the documentation link to install and configure a self-hosted integration runtime on the VM: Create a Self-hosted integration runtime
2. Change the integration runtime for your SQLServerDB linked service, validate the connection, and deploy the changes to your data factory

Connecting to Tier 2 environment

To connect Azure Data Factory to a Tier 2 environment, you don't need a self-hosted integration runtime because Tier 2 environments use Azure SQL. However, Tier 2 Azure SQL databases are firewall enabled, so you have to whitelist the IP address of Azure Data Factory to connect. Follow the documentation: Connecting to a Tier 2 database. Note that self-service database connections are only valid for 8 hours, so you have to update the database credentials in the Data Factory connection before executing.

Execute pipelines

Once the Azure Data Factory template is deployed successfully, navigate to the Data Factory and execute the pipelines. Find your Azure Data Factory in the Azure portal and select the Author & Monitor tile.

Execute the pipeline SQLTablesToADLS to export data and create the CDM metadata files. Validate the CDM folder structure and metadata.

If the manifest files are missing, most likely the Azure function failed to authenticate to the storage account. To troubleshoot, validate the Azure function configuration and the storage account access control. You can debug the Azure function locally in Visual Studio; a Postman collection template can be found under /SQLToADLSFullExport/CDMUtil.postman_collection.

Execute the pipeline CreateView to create the views, then validate the views created on Synapse SQL-On-Demand.
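One way to validate is to connect to the SQL-On-Demand endpoint and list the views in the database. This is a minimal check; the view names you see depend on the tables you exported.

```sql
USE AXDB
GO
-- List the views the CreateView pipeline generated
SELECT s.name AS schema_name, v.name AS view_name
FROM sys.views AS v
JOIN sys.schemas AS s ON s.schema_id = v.schema_id
ORDER BY v.name;
```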

To learn SQL-On-Demand concepts in detail, follow the blog post or use the documentation page: Synapse Analytics documentation.

Build and serve reports

Once you have created views on SQL-On-Demand to read your table data stored in the data lake, you can use any reporting and BI tool, such as Excel, SQL Server Reporting Services, or Power BI, to connect to the SQL-On-Demand endpoint just like any other Azure SQL database and build reports. The documentation shows how to connect Power BI to a SQL-On-Demand endpoint.
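From any of these tools the views behave like ordinary tables. For illustration, a report query might look like the following sketch; the CUSTTABLE view and its columns are hypothetical placeholders for whatever tables you exported.

```sql
-- Hypothetical report query; view and column names are placeholders.
SELECT CUSTGROUP, COUNT(*) AS CustomerCount
FROM dbo.CUSTTABLE
GROUP BY CUSTGROUP
ORDER BY CustomerCount DESC;
```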

Troubleshooting

- If your pipeline fails on the Azure function calls, validate your Azure function configuration.
- You can also debug the C# code by running CDMUtil_AzureFunctions locally with Postman; the Postman template can be found under /SQLToADLSFullExport/CDMUtil.postman_collection
- You can find the input parameters for the Azure function in the Azure Data Factory pipeline execution history.

Using CDMUtil Solution Locally

1. Right-click the CDMUtil_AzureFunctions project and set it as the startup project
2. In Visual Studio, select the account for authentication: Tools > Options > Azure Service Authentication > Account selection. The selected account must have Storage Blob Data Contributor and Storage Blob Data Reader access on the storage account and AAD access on the SQL-On-Demand endpoint
3. Click Debug; the Azure Functions CLI will open up
4. Download and install Postman if you don't have it already, then import the Postman collection. The collection contains requests for all methods with sample header values
5. Change the header values for the requests as per your environment and send the requests
6. In case of error, you can debug the local code

